OPTRONIC VISION EQUIPMENT FOR A TERRESTRIAL VEHICLE
Patent abstract:
Optronic vision equipment for equipping a land vehicle (3000), comprising: a panoramic image sensor (300); at least one steerable camera (310), having a better resolution in a field of view smaller than that of the panoramic image sensor; and an image display device (330, 430); characterized in that it further comprises a data processor (320) configured or programmed to: receive at least a first image (201) from said panoramic image sensor and a second image (202) from said steerable camera; synthesize, from the first and second images, a composite image (210) in which at least a portion of the second image (202) is embedded in a portion (2011) of the first image; and transmit said composite image to the image display device. Armored vehicle (3000) equipped with such optronic vision equipment. Method implemented by such optronic equipment.

Publication number: FR3052252A1
Application number: FR1600910
Filing date: 2016-06-07
Publication date: 2017-12-08
Inventors: Pascal Jerot; Dominique Bon; Ludovic Perruchot
Applicant: Thales SA
Patent description:
OPTRONIC VISION EQUIPMENT FOR A VEHICLE

The invention relates to optronic vision equipment for a land vehicle, in particular an armored vehicle or a tank.

It is known to equip such a vehicle with a plurality of detection/designation cameras operating in the visible or in the infrared, which are steerable and have a field of view of a few degrees, typically variable between 3° and 9° or between 4° and 12°, possibly reaching 20°. These cameras have very good resolution, but they are not easy to use because of their reduced field: it is difficult for the operator to locate the small region observed by such a camera within the environment in which the vehicle is moving. This is referred to as the "straw effect", because it is as if the operator were looking through a straw.

It is possible to overcome this drawback by modifying the optics of these cameras so as to allow them to operate in a very-wide-field mode (up to 40° to 45°). Implementing such a solution is expensive and, moreover, switching the camera to very-wide-field mode does not allow narrow-field vision at the same time.

The invention aims to overcome these drawbacks of the prior art. To do so, it exploits the fact that modern armored vehicles are often equipped with a very-wide-field vision system, for example a hemispherical sensor such as the "ANTARES" system from Thales, which provides vision over 360° of azimuth and over a vertical arc from -15° to +75°. This very wide field of view includes that of the detection/designation camera(s), at least for certain orientation ranges of the latter. An idea underlying the invention is therefore to combine, in the same display, an image portion acquired by such a very-wide-field vision system and an image acquired by a detection/designation camera. In accordance with the invention, the high-resolution, narrow-field image from the detection/designation camera is embedded in a lower-resolution, wider-field image portion from the vision system. This produces a synthetic image corresponding to the one that would be acquired by a virtual camera having the same orientation as the detection/designation camera, but a wider field of view. Advantageously, the user can zoom in to exploit the high resolution of the detection/designation camera, or zoom out to enlarge his field of vision and, for example, identify landmarks.

Thus, one object of the invention is optronic vision equipment intended to equip a land vehicle, comprising: a panoramic image sensor; at least one steerable camera, having a better resolution in a field of view smaller than that of the panoramic image sensor and contained therein for at least one set of orientations of the camera; and an image display device; characterized in that it further comprises a data processor configured or programmed to: receive at least a first image from said panoramic image sensor and a second image from said steerable camera; synthesize, from the first and second images, a composite image in which at least a portion of the second image is embedded in a portion of the first image; and transmit said composite image to the image display device.

According to particular embodiments of such equipment:

The data processor may be configured or programmed to synthesize said composite image so that it matches an image that would be acquired by a virtual camera having the same orientation as said steerable camera, but a wider field of view.
More particularly, the data processor may be configured or programmed to change the width of the field of view of said composite image in response to a command from a user.

The data processor may be configured or programmed to synthesize in real time a stream of said composite images from a stream of said first images and a stream of said second images.

The image display device may be a portable display device equipped with orientation sensors, the equipment also comprising a system for slaving the orientation of the camera to that of the portable display device.

The panoramic image sensor and the steerable camera may be adapted to operate in different spectral ranges.

The panoramic image sensor may be a hemispherical sensor.

The steerable camera may have a field of view, optionally variable, of between 1° and 20° and preferably between 3° and 12°.

Another object of the invention is an armored vehicle provided with such optronic vision equipment.

Yet another object of the invention is a method implemented by such optronic equipment, comprising the following steps: receiving a first image from a panoramic image sensor; receiving a second image from a steerable camera, said second image having a better resolution in a field of view smaller than that of the first image and contained therein; synthesizing, from the first and second images, a composite image in which at least a portion of the second image is embedded in a portion of the first image; and displaying said composite image.

According to particular embodiments of such a method:

Said composite image may correspond to an image that would be acquired by a virtual camera having the same orientation as said steerable camera, but a wider field of view.

The method may further include a step of changing the width of the field of view of said composite image in response to a command from a user.

A stream of said composite images may be synthesized in real time from a stream of said first images and a stream of said second images.

Said composite image may be displayed on a portable display device equipped with orientation sensors, the method also comprising the following steps: determining the orientation of said portable display device from signals generated by said sensors; and slaving the orientation of the camera to that of the portable display device.

Other features, details and advantages of the invention will become apparent on reading the description given, by way of example, with reference to the accompanying drawings, in which: FIG. 1 is a schematic representation of the principle of the invention; FIGS. 2A to 2E illustrate different images displayed during an implementation of the invention; and FIGS. 3 and 4 schematically illustrate optronic equipment according to two respective embodiments of the invention.

In this document:

The expression "detection/designation camera" denotes a steerable digital camera having a relatively small field of view, typically less than or equal to 12° or even 15°, but sometimes reaching 20°, both in the azimuthal plane and along a vertical arc. A detection/designation camera may operate in the visible spectrum, in the near infrared (night vision), in the medium or far infrared (thermal camera), or be multispectral or even hyperspectral.

The expressions "very wide field" and "panoramic", considered equivalent, denote a field of vision extending over at least 45° in the azimuthal plane, along a vertical arc, or both.
The term "hemispherical sensor" refers to an image sensor having a field of view extending 360 ° in the azimuth plane and at least 45 ° in a vertical arc. It may be a single sensor, for example using a hypergone lens ("fisheye"), or a composite sensor consisting of a set of cameras with a smaller field of view and a digital processor combining the images acquired. by these cameras. A hemispherical sensor is a special case of panoramic image sensor, or very large field. As has been explained above, one aspect of the invention consists in combining in the same display a portion of an image acquired by a hemispherical vision system (or more generally by a very large-field vision system) and an image. acquired by a detection / designation camera. This leads to the synthesis of one or more composite images that are displayed by means of one or more display devices, such as screens, virtual reality headsets, etc. When using equipment according to the invention, an operator can, for example, select a wide-field display mode - say 20 ° (in the azimuthal plane) x 15 ° (in a vertical arc). The selection is made with an appropriate interface tool: a keyboard, a wheel, a joystick, etc. A timely programmed data processor then selects a portion of an image from the hemispherical vision system, having the desired field width and oriented in the direction of view of the detection / naming camera. The image acquired by the detection / designation camera - corresponding for example to a field width of 9 ° x 6 °, is embedded in the center of this image, with the same magnification. This is illustrated by the left panel of FIG. 1, where the reference 100 designates the composite image, 101 corresponds to the outer part of this composite image, coming from the hemispherical vision system and providing a "context" allowing the operator to locate, while 102 designates its central portion from the detection camera / designation. Note that the elementary images 101 and 102 have the same magnification, so there is no break in continuity of the scene displayed by the composite image. If we ignore the fact that the elementary images 101 and 102 do not have the same resolution and can correspond to distinct spectral ranges, as well as a parallax defect that is difficult to avoid, the composite image corresponds to the image that would be acquired by a camera - which can be described as "virtual" - which would have the same orientation as the camera detection / designation but a wider field of view. If the operator zooms out further widening the field of view, the central portion 102 of the image narrows. If it zooms in, this central portion 102 widens at the expense of the outer portion 101, which is shown on the central panel of Figure 1, until the latter disappears (right panel of the figure). Reference is made here to a digital zoom feature, implemented by modifying the display. The detection / designation camera may also include an optical zoom; when optically zooming forward, however, the field width of the image from the camera decreases, so the central portion 102 of the composite image 100 narrows. A digital zoom applied to the composite image 100 can, if necessary, restore the dimensional ratio between the images 101 and 102. Advantageously, the detection / designation camera and the hemispherical vision system provide image streams at a rate of several frames per second. 
Advantageously, the detection/designation camera and the hemispherical vision system deliver image streams at a rate of several frames per second. Preferably, these images are combined in real time or near real time, that is to say with a latency not exceeding 1 second and preferably 0.02 second (the latter value corresponding to the standard duration of a video frame); a minimal sketch of such a combination loop is given below, after the description of FIG. 3.

FIGS. 2A to 2E illustrate in greater detail the composite images displayed on a screen 2 of optronic equipment according to the invention. FIG. 2A shows an overview of this screen. A strip 20 in the lower part of the screen corresponds to a first composite image obtained by combining the panoramic image 201 (360° in the azimuthal plane, from -15° to +75° perpendicular to this plane) coming from the hemispherical sensor and an image 202 coming from a detection/designation camera, which in this case is an infrared camera. Concretely, a rectangle of the panoramic image 201 is replaced by the image 202 (or, in some cases, by a portion of this image). If the detection/designation camera is equipped with an optical zoom, the image 202 may have a variable size, but in any case it occupies only a small part of the panoramic image 201. FIG. 2B shows a detail of this strip. Note that, in the case of a strip display, the composite image is not necessarily centered on the line of sight of the detection/designation camera.

In the upper part 21 of the screen is displayed a second composite image 210 showing the image 202 from the detection/designation camera embedded in a context 2011 coming from the hemispherical image sensor, that is to say a portion 2011 of the panoramic image 201. The user may decide to activate a digital zoom (independent of the optical zoom of the detection/designation camera) to reduce the width of the field of view of the composite image 210. The embedded image 202 then appears enlarged, to the detriment of the portion 2011 of the panoramic image. FIGS. 2C, 2D and 2E correspond to increasingly strong zooms. FIG. 2E, in particular, corresponds to the limiting case where only the image 202 from the detection/designation camera is displayed (see the right panel of FIG. 1). Of course, the user can at any time decide to zoom back out to benefit once again from the contextualization of the image.

An advantage of the invention is to be able to benefit both from the strip display 20, which identifies the region observed by the detection/designation camera, and from the composite image 210 with an intermediate field width displayed in the upper part 21 of the screen. This would not be possible using an optical zoom on the detection/designation camera alone. Another part of the screen can be used to display detail views of the panoramic image 201; this is not directly related to the invention.

FIG. 3 schematically illustrates optronic equipment according to a first embodiment of the invention, installed on an armored vehicle 3000. The reference 300 designates the hemispherical image sensor mounted on the roof of the vehicle; 310 corresponds to the detection/designation camera, installed in a turret; 311 to a joystick-type control device that allows an operator 350 to control the orientation of the camera 310 as well as its optical zoom; and 320 denotes the data processor receiving the image streams from the sensor 300 and the camera 310 and synthesizing one or more composite images, which are displayed on the screen 330. The lower part of the figure shows these composite images 210, which have been described above with reference to FIGS. 2A to 2E.
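The sketch below, again purely illustrative, shows the kind of real-time combination loop referred to above; the frame sources, camera and display interfaces are hypothetical, and synthesize_composite is the illustrative function sketched earlier.

```python
import time


def run_display_loop(pano_source, cam_source, camera, display, frame_period_s=0.02):
    """Hypothetical real-time loop: at each display refresh (~50 Hz here), the
    latest panoramic and camera frames are combined into a composite image, so
    that the added latency remains of the order of one frame.

    pano_source / cam_source expose latest_frame(); camera exposes
    get_orientation() and fov_deg; display exposes fov_deg and show().
    """
    while True:
        t0 = time.monotonic()
        panoramic = pano_source.latest_frame()   # most recent hemispherical mosaic
        cam_image = cam_source.latest_frame()    # most recent detection/designation frame
        az, el = camera.get_orientation()        # current line of sight of the camera
        composite = synthesize_composite(panoramic, cam_image, az, el,
                                         camera.fov_deg, display.fov_deg)
        display.show(composite)
        # Sleep for whatever remains of the frame period, if anything.
        time.sleep(max(0.0, frame_period_s - (time.monotonic() - t0)))
```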
FIG. 4 schematically illustrates optronic equipment according to a second embodiment of the invention, also installed on the armored vehicle 3000. This equipment differs from that of FIG. 3 in that the display screen 330 and the control device 311 are replaced by a portable display device 430, resembling for example a pair of binoculars, a virtual reality headset or a tablet. This device 430 is equipped with position and orientation sensors (gyroscopes, accelerometers, optical sensors, etc.) making it possible to determine its position and, above all, its orientation relative to a frame of reference tied to the vehicle. A servo system 435, some components of which may be shared with the data processor 320, slaves the orientation of the camera 310 to that of the device 430 (a simple sketch of such a servo loop is given at the end of this description). The composite images displayed by the device are adapted in real time to the variations in orientation. The operator 350 thus has the illusion of looking through the armor of the vehicle 3000 with a pair of binoculars of variable magnification. In addition, a screen may also be present, displaying, for example, a very-wide-field composite image such as the strip 20 illustrated in FIG. 2A.

In the embodiments that have just been described, the optronic equipment comprises a single very-wide-field vision system (a hemispherical sensor) and a single detection/designation camera. More generally, however, such equipment may comprise several very-wide-field vision systems, for example operating in different spectral regions, and/or several detection/designation cameras, which may or may not be steerable independently of one another. Several different composite images can thus be generated and displayed.

The data processor may be a generic computer, a microprocessor board specialized in image processing, or even a dedicated digital electronic circuit. For the implementation of the invention, it executes image-processing algorithms that are known per se.
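As an illustration of the servo principle only, and not of the actual servo system 435, the sketch below slaves a turret-mounted camera to the orientation reported by a portable display device using a simple proportional law; the display and turret interfaces, the gain and the loop period are hypothetical.

```python
import time


class OrientationServo:
    """Minimal proportional servo: the steerable camera follows the orientation
    reported by the portable display device (principle of the servo system 435).

    `display` and `turret` are hypothetical interfaces:
      display.read_orientation() -> (azimuth_deg, elevation_deg) in the vehicle frame
      turret.get_orientation()   -> (azimuth_deg, elevation_deg)
      turret.command_rate(azimuth_rate_deg_s, elevation_rate_deg_s)
    """

    def __init__(self, display, turret, gain=2.0, period_s=0.02):
        self.display = display
        self.turret = turret
        self.gain = gain          # commanded rate (deg/s) per degree of pointing error
        self.period_s = period_s  # loop period, of the order of one video frame

    def step(self):
        target_az, target_el = self.display.read_orientation()
        current_az, current_el = self.turret.get_orientation()
        # Wrap the azimuth error into [-180, 180) so the turret turns the short way round.
        err_az = (target_az - current_az + 180.0) % 360.0 - 180.0
        err_el = target_el - current_el
        self.turret.command_rate(self.gain * err_az, self.gain * err_el)

    def run(self):
        while True:
            self.step()
            time.sleep(self.period_s)
```

In practice the gain and loop period would be tuned to the turret dynamics; a pure proportional law is simply the most compact choice for a sketch.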
Claims:
Claims (14) [1" id="c-fr-0001] An optronic vision equipment for equipping a land vehicle (3000) comprising: a panoramic image sensor (300); at least one steerable camera (310), having a better resolution in a field of view smaller than that of the panoramic image sensor and contained therein for at least one set of orientations of the camera; and an image display device (330, 430); characterized in that it further comprises a data processor (320) configured or programmed to: receive at least a first image (201) of said panoramic image sensor and a second image (202) of said rotatable camera; from the first and second images, synthesizing a composite image (20, 210) in which at least a portion of the second image (202) is embedded in a portion (2011) of the first image; and transmitting said composite image to the image display device. [2" id="c-fr-0002] An opto vision equipment according to claim 1, wherein the data processor (320) is configured or programmed to synthesize said composite image (210) so that it matches an image that would be acquired by a virtual camera having the same orientation as said steerable camera, but a wider field of view. [3" id="c-fr-0003] The optronic vision equipment of claim 2 wherein the data processor (320) is configured or programmed to change the field of view width of said composite image (210) in response to a user command (350). ). [4" id="c-fr-0004] 4. optronic vision equipment according to one of the preceding claims wherein the data processor (320) is configured or programmed to synthesize in real time a stream of said composite images from a stream of said first images and of a stream of said second images. [5" id="c-fr-0005] 5. Optronic vision equipment according to one of the preceding claims wherein the image display device (430) is a portable display device equipped with orientation sensors, the equipment also comprising a servo system (435) of the orientation of the camera to that of the portable display device. [6" id="c-fr-0006] 6. Optronic vision equipment according to one of the preceding claims wherein the panoramic image sensor (300) and the steerable camera (310) are adapted to operate in different spectral ranges. [7" id="c-fr-0007] 7. Optronic vision equipment according to one of the preceding claims wherein the panoramic image sensor (300) is a hemispherical sensor. [8" id="c-fr-0008] 8. Optronic vision equipment according to one of the preceding claims wherein the steerable camera (310) has a field of view, optionally variable, between 1 ° and 20 ° and preferably between 3 ° and 12 °. [9" id="c-fr-0009] 9. Armored vehicle (3000) equipped with optronic vision equipment according to one of the preceding claims. [10" id="c-fr-0010] 10. A method implemented by an optoelectronic equipment according to one of the preceding claims, comprising the steps of: receiving a first image (201) of a sensor (300) of panoramic images; receiving a second image (202) of an orientable camera (310), said second image having a better resolution in a field of view smaller than that of the first image and contained therein; from the first and second images, synthesizing a composite image (210) in which at least a portion of the second image is embedded in a portion (2011) of the first image; and displaying said composite image. [11" id="c-fr-0011] 11. The method of claim 10 wherein said composite image (210) corresponds to an image that would be acquired by a virtual camera having the same orientation as said steerable camera, but a wider field of view. 
[12" id="c-fr-0012] The method of one of claims 10 or 11 further comprising the step of: changing the width of the field of view of said composite image (210) in response to a command from a user. [13" id="c-fr-0013] 13. Method according to one of claims 10 to 12, wherein a stream of said composite images is synthesized in real time from a stream of said first images and a stream of second images. [14" id="c-fr-0014] The method of one of claims 10 to 13 wherein said composite image is displayed on a portable display device (430) equipped with orientation sensors, the method also comprising the steps of: determining the orientation of said device portable display from signals generated by said sensors; and slaving the orientation of the camera (210) to that of the portable display device.
Patent family (publication number, publication date): WO2017211672A1, 2017-12-14; JP2019526182A, 2019-09-12; RU2722771C1, 2020-06-03; CN109313025A, 2019-02-05; EP3465086B1, 2020-07-22; EP3465086A1, 2019-04-10; FR3052252B1, 2019-05-10; CA3026899A1, 2017-12-14; IL263345A, 2021-06-30; US20190126826A1, 2019-05-02; IL263345D0, 2018-12-31.